Search Results: "Roland Mas"

26 August 2007

Roland Mas: Gforge in Debian, August 2007

As I type this, debs for a current snapshot of upstream Gforge are on their way to Debian Sid. They are taken from the Subversion trunk and not from the yet-to-be-released 4.6 branch, because a few important changes have taken place on the trunk only, and the 4.6 branch is merged anyway. Hence the version number, 4.6.99+svn6078-1. These packages did spend some time in experimental, and I didn't get any bug reports, but that doesn't mean they're bug-free. Use with caution.

One of the big changes is that Gforge now uses Gettext rather than its home-made internationalisation system. That probably means fewer problems, and a more standard system allowing more people to get involved. Can you guess what I'm hinting at? Yes, it's a call for translations! Anyone interested in getting involved should bzr branch http://alioth.debian.org/~lolando/bzr/gforge/upstream-svn/trunk/, and start poking at the existing *.po files (or creating new ones!).

Why the private repo and not the upstream Subversion repository? Rationale follows. My job as a freelancer, as well as my role as an Alioth admin, involves maintaining separate branches of the Gforge code (different clients have different needs and patches). To facilitate patch migration, I therefore need a distributed VCS, and I've been using Bazaar for a few years, with manual gatewaying between upstream CVS first, then Subversion, and a few private Bazaar branches. Now that Bazaar (as in Arch) is dead, I'm using Bazaar (as in Bazaar-NG, Bazaar 2, or bzr), which seems to be approaching a 1.0 release, and which provides a plugin to interoperate easily with Subversion repositories. And since the upstream Subversion repository is not accessible anonymously anyway, I decided to publish my gateway branch. I'll probably publish more branches as time passes (probably the Alioth branch, and quite possibly feature branches too).
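Speaking of the call for translations: a quick way to gauge how much work a given .po file still needs is to count the entries whose msgstr is empty. This is only a rough heuristic (multi-line translations also start with an empty msgstr line), and fr.po is a placeholder name, not necessarily a file in the tree:

```shell
# Count entries whose translation is still empty in a PO file.
# (fr.po is a placeholder; adjust to the actual file in the branch.)
untranslated=$(grep -c '^msgstr ""$' fr.po)
echo "$untranslated msgstr entries still empty (rough estimate)"
```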
Note: if you want to share a bzr repository of a project containing PHP scripts with Apache, you may run into a problem, because the bzr repository contains files named *.php.knit and *.php.kndx, and Apache will happily hand these files to the PHP interpreter when serving them, which is not what you want. My trick to fix that is to add a .htaccess file where the repository is stored, with the following contents:
AddHandler None .knit .kndx
This will ensure that these files will be sent straight to the HTTP client, and not through the PHP interpreter.
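If the AddHandler trick doesn't take effect on your Apache setup (I'm not certain the None handler name is honoured everywhere), an alternative sketch is to map the extensions to Apache's built-in default handler explicitly:

```apache
# Untested alternative: serve *.knit and *.kndx as plain files via the
# default handler, bypassing whatever handler is mapped to .php.
<FilesMatch "\.(knit|kndx)$">
    SetHandler default-handler
</FilesMatch>
```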

26 June 2007

Roland Mas: Handmade CPU scaling

It's like this. I've got this very fine computer, only it's about four years old. Contrary to popular belief, that doesn't mean it's too slow for today's computing, so I don't feel any particular need to change it or upgrade it (I've been known to use an 8-year-old computer as my main server up till recently). The problem is that its main processor is an Athlon XP 2200+, and that's not the "mobile" version. So there's no frequency scaling, no power-saving CPU states, and so on. And since I moved from a hot place (the French Riviera) to an even hotter place (near Montpellier), summer is likely going to cause frequent crashes as temperature rises. Oh, and my office/computer room is now right under the roof, so it does get hot rather fast. So, something has to be done. First solution (rather blunt): open up the computer case and point a large fan at it. It worked last summer, but it's hardly elegant, and it's noisy. Second solution: try to emulate the CPU frequency utilities with what's available. In this case, the CPU throttling feature accessible through /proc/acpi/processor/CPU1/throttling. I had a look around, but the usual suspects (cpufrequtils, powernowd and so on) didn't seem to be able to help. Hence the following script:
#! /bin/sh
high=55
low=50
minload=1
delay=60
sensorbus=w83697hf-isa-0290
sensor=temp2
maxlevel=15
while sleep $delay ; do
    temp=$(sensors $sensorbus \
          | awk "/^$sensor:/ { print \$2 }" \
          | sed 's/^+//' \
          | sed 's/\..*//')
    cur=$(cat /proc/acpi/processor/CPU1/throttling \
          | awk '/^active state:/ { print $3 }' \
          | sed 's/T//')
    load=$(awk '{ print $2 }' /proc/loadavg | sed 's/\..*//')
    if [ $temp -ge $high ] && [ $cur -lt $maxlevel ] ; then
        next=$(( $cur + 1 ))
        logger "Temp $temp, load $load, throttling down from $cur to $next"
        echo $next > /proc/acpi/processor/CPU1/throttling
    elif [ $temp -le $low ] && [ $load -ge $minload ]  && [ $cur -gt 0 ] ; then
        next=$(( $cur - 1 ))
        logger "Temp $temp, load $load, throttling up from $cur to $next"
        echo $next > /proc/acpi/processor/CPU1/throttling
    fi
done
You'll notice that it unconditionally throttles the CPU if the temperature is too high, and releases some of the power if the temperature is acceptably low and the system load seems to demand more power (no need to have the box running at full speed if it's idle). Feel free to re-use, comment, tweak, and so on.

5 June 2007

Martin-Éric Racine: Tip of the day: fetching all Debian source packages

A few weeks ago, the Linutop team received a request for its source code. We had already anticipated the GPL source offer clause in our development plan, so it was just a matter of my getting around to producing a source code ISO image. Piece of cake, right? Almost. You see, Linutop includes a custom kernel package and several separate wireless module packages, so the following command would not work as expected:
apt-get --download-only --ignore-missing source $(dpkg --get-selections | cut -f 1)
Why is that? Because APT ignores the --ignore-missing option when executing the source command. Instead, Roland Mas suggested that I use this simple Bourne loop:
for p in $(dpkg --get-selections | cut -f 1) ; do apt-get --download-only source $p ; done
Done this way, the source code of each package is fetched individually and, if any package's source is not available, APT exits but the Bourne loop moves on to the next package in the list. Nifty, isn't it?
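A small variation on that loop, in case you also want a record of which packages had no fetchable source (the failure log is my addition, not part of the original suggestion):

```shell
#! /bin/sh
# Collect the names of packages whose source could not be downloaded,
# while still moving on to the next package.
failed=""
for p in $(dpkg --get-selections | cut -f 1) ; do
    apt-get --download-only source "$p" || failed="$failed $p"
done
[ -z "$failed" ] || echo "No source fetched for:$failed"
```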

10 May 2007

Roland Mas: Alioth going down, then up

Just a quick announcement for those of you who don't read debian-devel-announce: alioth.debian.org will be upgraded from Sarge to Etch this coming Sunday (2007-05-13). You may notice some downtime, although hopefully not too much.

28 March 2007

Roland Mas: Camsync, now with pmount support

My little picture-downloading tool just got a new feature: instead of relying on an automounter, it can now invoke pmount itself. That makes it easier to set up. It also allows you to unplug the camera as soon as the program is done running; no more waiting for the automounter to unmount it after a timeout. The Synchronisation scripts for digital cameras page has been updated accordingly.

12 March 2007

Roland Mas: Alioth goes Mercurial

For those who don't read debian-devel-announce: the Alioth team, with help from John Goerzen, has set up an infrastruct^W^Wa couple of scripts on Alioth, so that it's now hosting Mercurial repositories on hg.debian.org. Please read the AliothHg page on the Debian wiki for details, but don't expect many surprises: it'll work mostly the same way as the other VCSes hosted on Alioth.

9 February 2007

Roland Mas: Docbook-Slides to Docbook translator

I have a set of slides for an upcoming training session. They're written in Docbook-Slides, a customised version of Docbook for, well, presentation slides. That customisation is available for Debian in the docbook-slides package, which also contains a few XSL stylesheets to convert the slides into several HTML forms (with frames, without frames, with/without some Javascript helpers, etc.). Unfortunately, there isn't any proper way to turn these slides into something printable. You can get XSL-FO with other provided stylesheets, but since passivetex is no longer part of Debian, it's hard to get a PDF. Which is silly, since dblatex (as well as others, I'm sure) can turn plain Docbook into nice PDF files. So I took the time to read a fine XSLT tutorial, and came up with my very first stylesheet, which doesn't print out "Hello world" but rather turns a Docbook-Slides document into a Docbook article. Which dblatex is happy to convert to PDF for me. Let there be much rejoicing. I'm not sure it's a good idea to actually publish one's first snippet of any language, but I'll do it nevertheless, out of hope that I can avoid frustration for some people:
<?xml version="1.0"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
<xsl:template match="slides">
<article>
  <xsl:apply-templates select="@*"/>
  <xsl:apply-templates/>
</article>
</xsl:template>
<xsl:template match="slidesinfo">
<articleinfo>
  <xsl:apply-templates select="@*"/>
  <xsl:apply-templates/>
</articleinfo>
</xsl:template>
<xsl:template match="foilgroup|foil">
<section>
  <xsl:apply-templates select="@*"/>
  <xsl:apply-templates/>
</section>
</xsl:template>
<xsl:template match="speakernotes" />
<xsl:template match="*">
  <xsl:element name="{name()}">
    <xsl:apply-templates select="@*"/>
    <xsl:apply-templates/>
  </xsl:element>
</xsl:template>
<xsl:template match="@*">
  <xsl:attribute name="{name()}">
    <xsl:value-of select="."/>
  </xsl:attribute>
</xsl:template>
</xsl:stylesheet>
Attentive readers will notice that, apart from slides and slidesinfo, only the foilgroup and foil elements are converted. In particular, the speakernotes elements are discarded, because I figure they have no place on the printed document. If you need them, beware. I think I understand what they mean by "XSLT is verbose".
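For the record, this is roughly how I'd expect the stylesheet to be used; the file names are made up (check the dblatex documentation for its exact options):

```shell
# Hypothetical file names: slides2article.xsl is the stylesheet above,
# slides.xml the Docbook-Slides input.
xsltproc slides2article.xsl slides.xml > article.xml
dblatex -o slides.pdf article.xml
```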

7 February 2007

Roland Mas: Camsync, now with mass storage support

As already mentioned (Camsync is shaping up), I've been working on a script to automate part of the workflow of digital photography (the part where pictures go from the camera to the computer and get registered into the photo management program). This script was PTP-only so far, since my other, older, USB mass storage digital camera was broken. I eventually found the time and willpower to tackle it with the usual method, and converted my Contax U4R back from an expensive SD-card reader into a working camera (even if the autofocus seems to be less accurate than it used to be). Anyway. My point is: I've updated my camsync script (and the related Synchronisation scripts for digital cameras page) so it can now handle USB mass storage cameras, as long as they can be accessed through an automounter. I haven't tried it on a memory card reader, but it may just work. As usual, feel free to use, hack, send patches, and report your experiences.

24 December 2006

Manoj Srivastava: Arch, Ikiwiki, blogging

One of the reasons I have only blogged 21 times in thirty months is the very kludgey workflow I had for blogging: I had to create the file manually, then scp it by hand, and ensure that any ancillary files were in place on the remote machine that serves up my blog. After moving to ikiwiki, and thus arch, there would be even more overhead, were it not so amenable to scripting.

Since this is arch, and therefore creating branches and merging is both easy and natural, I have two sets of branches -- one set related to the templates and actual blog content I serve on my local development box, and a parallel set of branches that I publish. The devel branches are used by ikiwiki on my local box; the remote ikiwiki uses the publish branch. So I can make changes to my heart's content on the devel branch, and then merge into my publish branch. When I commit to the publish branches, the hook functions ensure that there is a fresh checkout of the publish branch on the remote server, and that ikiwiki is run to regenerate the web pages to reflect the new commit.

The hook functions are nice, but not quite enough to make blogging as effortless as it could be. With the move to ikiwiki, and the dissociation of classification and tagging from the file system layout, I have followed the lead of Roland Mas and organised my blog layout by date; posts are put in blog/$year/$month/$escaped_title. The directory hierarchy might not exist for a new year or month. A blog posting may also show up in two different archive indices: the annual archive index for the year, and a monthly index page created for every month I blog in. However, at the time of writing, there is no annual index for the next year (2007), or the next month (January 2007). These have to be created as required. All this would get quite tedious, and indeed would frequently remain undone -- were it not for automation. To make my life easier, I have blogit!, which takes care of the niggling details.
When called with the title of the prospective post, this script figures out the date, ensures that the blog directory structure exists (creating path components and adding them to the repository as required), creates a blog entry template, adds the blog entry to the repository, creates the annual or monthly archive index and adds those to the repository as needed, and finally calls emacs on the blog posting file. Whew.
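I don't have Manoj's actual blogit! in front of me, but the directory-creation part of what he describes boils down to something like the following sketch; the script name, escaping rule and template are my assumptions based on the layout described above, and the VCS "add" steps are left out:

```shell
#! /bin/sh
# Minimal blogit-style sketch: given a title, ensure blog/$year/$month
# exists and drop a post template there.
title="$1"
year=$(date +%Y)
month=$(date +%m)
escaped=$(echo "$title" | tr 'A-Z' 'a-z' | tr ' ' '_')
dir="blog/$year/$month"
mkdir -p "$dir"                       # create missing path components
post="$dir/$escaped.mdwn"
if [ ! -e "$post" ]; then
    printf '[[meta title="%s"]]\n\n' "$title" > "$post"
fi
echo "$post"                          # hand this path to your editor
```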

18 December 2006

Manoj Srivastava: I am now an Ikiwiki user!

Well, this is my first post. I have managed to migrate my blog over to Ikiwiki, including all the historical posts. The reason for the migration was that development on my older blogging mechanism, Blosxom, entered a hiatus, though recently it has been revived on SourceForge. I like the fact that Ikiwiki is based on a revision control system, and that I know the author pretty darned well :-). One of my primary requirements for the migration was that I be able to replicate all the functionality of my existing blog, and this included the look and feel of my blog (which I do happen to like, despite the wincing I see from some visitors to my pages). This meant replicating the page template and CSS from my blog. I immediately ran into problems: for example, my CSS markup for my blog was based on being able to mark up components of the date of the entry (day, day of week, month, etc.) to achieve fancy effects, and there was no easy way to use preexisting functionality of Ikiwiki to present this information to the page template. Thus was born the varioki plugin, which attempts to provide a means to add variables for use in ikiwiki templates, based on variables set by the user in the ikiwiki configuration file. This is fairly powerful, allowing for uses like:
    varioki => {
      'motto'    => '"Manoj\'s musings"',
      'toplvl'   => 'sub { return $page eq "index" }',
      'date'     => 'sub { return POSIX::strftime("%d", gmtime((stat(srcfile($pagesources{$page})))[9])); }',
      'arrayvar' => '[0, 1, 2, 3]',
      'hashvar'  => '{ 1, 1, 2, 2 }'
    },

The next major stumbling block was archive browsing for older postings; Blosxom has a nice calendar plugin that uses a calendar interface to let the user navigate to older blog postings. Since I really liked the way this looks, I set about scratching this itch as well; and now ikiwiki has attained parity with Blosxom as far as calendar plugins go. The calendar plugin, and the archive index pages, led me to start thinking about the physical layout of the blog entries on the file system. Since the tagging mechanism used in ikiwiki does not depend on the location in the file system (an improvement over my Blosxom system), I could lay out the blog postings in a more logical fashion. I ended up taking Roland Mas' advice and arranging for the blog postings to be created in files like:
 blog/year/month/date/simple_title.mdwn

The archives contain annual and monthly indices, and the calendar front end provides links to recent postings and to recent monthly indices. So, with a few additions to the arch hook scripts, and perhaps a script to automatically create the directory structure for new posts and to create annual and monthly indices as needed, I'll have a low-effort blogging workflow, and I might manage to blog more often than the two postings I have had all through the year so far.

10 December 2006

Wouter Verhelst: New toy.

On Thursday, I bought me a digital camera; a Nikon D50. That's the cheapest DSLR camera they had at the shop where I bought it; they had models at prices going up to just under 7000. The latter obviously is "slightly" out of budget for me, but this one isn't bad at all. I took some pictures with it on Friday, and yesterday I went to "De Oude Landen", a local (and rather unique) piece of nature, to try out my new camera. Most of the pictures I took then were crap due to bad lighting; I guess I need to read the manual a bit more. We'll see. In any case, it's a fun piece of machinery, and it seems my long-lost photography sk1llz are coming back now (I did a photography course when I was 13 or 14). Love it. Roland Mas pointed out that f-spot is a great application to manage pictures, and I have to agree with him. Apart from the fact that it is a bit pedantically gnome-esque in its settings dialog ("settings, what's that? Oh, right, that's where you enable the screensaver"), it has a whole bunch of nice and interesting features, such as the ability to tag pictures, export them to CD or web-based applications, and more interesting stuff. A bit too much clickety-clickety, but then I guess it's impossible to design a good command-line interface for managing graphical subject matter. Setting up Gallery on Sarge is not even half as hard as I'd feared, either. Long live the Debian Apache team. So, yes, the above means I have a gallery online. Take a look if you care; I personally very much like this one.

7 December 2006

Roland Mas: Camsync is shaping up

Did I mention a few scripts I'm using to automagically download my digital pictures off the camera and into F-Spot? Yes, I think I did. I probably mentioned a large bunch of largely spaghetti shell-scripts, too. Fearsome. Well, fear no more! I've started a rewrite, a proper one, in Perl. With a configuration file and all. No camera models or file paths hardcoded. Probably faster, too, since I only fork a few gphoto2 processes in total (instead of one for each downloaded picture). Yes, gphoto2. I haven't reimplemented the mass-storage part yet since I mainly use my PTP camera these days. It may happen someday. I decided to call my beginning-of-a-program "Camsync". Then of course I found that this name was already taken for something similar. I'll contact the author of that something and see what happens. In the meantime, all you PTP-only camera owners may want to have a second look at sync-digicams — it's been updated.

10 November 2006

Roland Mas: Exif, give or take a few hours

I finally got myself a nice digital camera. Nice as in: it fits in a pocket, it takes reasonably good-looking pictures, and it's not a Canon, so I can access it as a USB mass storage device. (For those who care, it's a Kyocera/Contax U4R.) So I've started using F-Spot (which is tremendously cool) and Gallery a bit more intensively than previously, and I noticed something curious: my pictures had wrong times. I thought that digital cameras stored them inside the picture files... Hmm. On to investigate. Time passes as I follow the threads of the information flow (use the source, Luke). Then I scream in horror. Then I go see something I hope is authoritative and therefore likely to have been well thought-out (read the spec, Luke). Then I scream some more. Let it be widely known that the Exif specification, which is the standard for embedding metadata into image files from digital cameras, encodes the date and time of the picture as a string. A fat, ugly, 20-character string. More precisely, the format is YYYY:MM:DD HH:MM:SS. I don't have anything against strings, they are rather easy to parse by humans (as demonstrated by strings /home/roland/images/photos/contax-u4r/kicx0700.jpg | head). But they should be done right. In this case, it would have made a lot of sense to enlarge that string to 26 characters and add +ZZZZ or something similar, so as to be able to encode, say, a timezone, in order to have dates and times that actually mean something, instead of something that may or may not be the current time plus or minus a random number of hours and minutes. I thought I could cheat by starting F-Spot with TZ=UTC f-spot (among my electronic gadgets, those that can't cope with timezones are set to UTC, with the exception of my alarm clock). The result is bizarre: the date is shown as being one hour late. I could understand if it were two (I'm at UTC+2 these days), but where does that one come from?
I suppose I'll have to cheat differently, by keeping a patched version of F-Spot that interprets the DateTime data from Exif as UTC, instead of using the current timezone. Or maybe I could do that in my hotplug script: it already rsyncs images from the digicam to the hard disk, then makes sure the digicam's card has at least $foo megabytes of free space. I suppose it could also fiddle with the Exif tags on the fly. Anyway. People need to learn: text without encoding information is useless, cryptography without a web of trust is useless, and dates/times without timezone info are useless. If one of my esteemed readers happens to be (or know) a member of the Exif specification group (I understand it's called "JEITA"), please tell them that this needs to be fixed in a future version of the spec. I suppose I'll have to live with it with my current digicam, but maybe the next one will be better in that regard. The pictures themselves are nice, though.
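To illustrate how little there is to that 20-character string, here is the kind of munging such a hotplug script would have to do. The sample value is made up, and the sed expression only converts the date part to ISO form; it cannot conjure up the missing timezone, which is the whole complaint:

```shell
# Exif stores the date as "YYYY:MM:DD HH:MM:SS" -- no timezone at all.
exifdate='2006:11:10 14:03:22'
isodate=$(echo "$exifdate" | sed 's/^\([0-9]*\):\([0-9]*\):\([0-9]*\)/\1-\2-\3/')
echo "$isodate"    # 2006-11-10 14:03:22, in *some* unknown local time
```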

Roland Mas: New face, new colours

No, I didn't get a haircut, nor did I dye my hair blue (yet). I just partly un-transparentified my X-Face-based hackergotchi for Planet Debian and added that oh-so-classy semi-translucent drop shadow. And I also harmonised the colour scheme of my blog with the one on my pro site (yes, I'm a part-time free software freelancer). Again, using a colour theme inspired by Manoj's script and blog article. I'm an absolute fashion victim. Must... not... put... "hacker emblem"... somewhere.

Roland Mas: Three more packages

Jens Gecius has just dropped out of the Debian new maintainer process, citing time constraints. That means he's no longer maintaining his packages, and I have agreed to adopt them. Which is a nice turn: I'm happy to maintain them, and I'm very happy that someone did the initial packaging while I only had to review it (I sponsored the uploads up until now). I'm now the proud maintainer of the dvtitler, kinoplus and timfx source packages, all three being collections of plugins for Kino (a digital video editor small and functional enough for my needs), containing effects, filters, transitions and so on. I'm not viscerally attached to them, though, so if anyone wants them more than I do, just ask. Now if the udev, sysfs and ieee1394 driver authors could sit around a table and fix things so that the appropriate raw1394/video1394/dv1394 device nodes appear in /dev when I plug in my DV camcorder, that would be even nicer. I'm currently having to reboot into Linux 2.4 to grab the video out of the camcorder and send it back.

Roland Mas: Timfx 0.2.2-1

Jens wants me to sponsor timfx 0.2.2-1, which fixes an FTBFS bug. Unfortunately, it also removes the automatic updating of the config.sub and config.guess files. Since that's a useful feature, I told him to un-remove it, and then I'll upload 0.2.2-2.

Roland Mas: ADSL upgrade part 1

It seems my ADSL link has now been upgraded to 2048k/128k (from 1024k/128k previously, and 512k/128k before that). All that for €30/month. That's the first part of the coolness. I'm still waiting for the second part: the switch to an "unbundled" line, which should bump things to 6M/512k. Not sure I'll ever use the 6M part, but the 512k uplink would definitely be a bonus. I host stuff here, including pictures, a blog and a couple of websites... Oh, and dput does take far too long already.

Roland Mas: Geekish week-end

So I spent most of the (long) week-end hacking on various things, and behaving geekishly. Gforge 4.0 packaging is going along rather nicely. Packages built from current CVS won't use LDAP anymore (except for the old gforge-mta-exim, but you really should be using gforge-mta-exim4 or -mta-postfix by now). Shell accounts, e-mail redirection and mailing-list resolution now use direct PostgreSQL lookups. No more install-ldap.sh scripts running hourly and taking forever. No more nightmares setting up slapd and trying to clean up after it when it explodes. Yay! Other news, on the video front: okay, I gave in, I finally accepted the next-best-thing that so many people have sent me. My raw1394 and dv1394/host0/PAL/out device nodes are now statically created by me, and symlinked by udevd via links.conf. Urgh. It means I can grab video from the camcorder, edit it, and export it back to the camcorder, all in the same Kino session (and under a reasonably current 2.6 kernel). Editing video is nice, but it's not very useful if you have to carry your camcorder along at all times to show people. So I wanted/needed a free, working encoder for a nice codec that I could use. I wanted Theora. I whined for some time, and I finally understood that Kino wouldn't have anything to do with it, and I had to use ffmpeg2theora. Blargh. Whatever, I grabbed it from upstream and packaged it (it's at http://roland.mas.free.fr/debian/ if you want it -- someone please ITP it, take it, write a manpage, and upload it to Debian). Now Kino can export to Theora, cool. Now I just have to find a media player able to play Theora correctly, and things will be fine. Xine only wants to play the Vorbis (audio) track, and shows me a visual representation of it. VLC is fine after about fifteen seconds, but it starts by playing audio too fast, then slows down, while video takes long to appear, then comes up too fast and slows down to normal speed too. As for totem-gstreamer...
It starts by playing audio fine, but video is about 5-6 times too slow for as long as there is sound to play. When the end of the audio track is reached, all the video is played as fast as possible (and in total silence). Oh, and I also got my ass kicked at Freeciv. I used to play Civilisation for days in the good old Atari ST days, but I seem to have forgotten quite a lot. Or maybe I was no good to start with, who knows, the AIs never told me. So I need to practice against AIs, I suppose, before I next connect to a game server. Okay, so I spent the whole weekend on a computer. So what? It was raining anyway.

8 November 2006

Roland Mas: Adding some non-blog pages

This blog/website is (currently) powered by Ikiwiki. That means I can rather easily add non-blog pages for content that's not quite "news", yet deserves a location more visible than some ~/public_html/tmp/ directory. I'm therefore starting the /static/ section of the site. Its first (and, so far, only) page is one giving some details about my digital camera setup. I may add some more pages as time passes. Not sure yet whether I'll make an index or let people find the pages through a search engine or the links.

1 November 2006

Roland Mas: New digital camera and related setup

I recently bought this lovely Canon EOS 350D camera from a friend who bought the new 400D instead. It's a nice DSLR camera, which I'm convinced is able to take magnificent pictures when used appropriately (my friend's pics are a good example). Since I like things to Just Work, I updated my image-downloading script. It used to just do an rsync, since my previous camera was available as a USB mass storage device, which I could simply mount with autofs. But the Canon guys still don't make cameras that work that way, and one has to use the PTP protocol instead. Which means rsync isn't an option, and one has to do shell and awk magic to retrieve all the pictures: the camera doesn't even put all images in the same folder, but starts a new one every 100 shots, so you have to parse the output from gphoto2 and get the pics one by one. Anyway. It works, and it even registers the pictures in the F-Spot database, with a tag on them (with a semantic of "TODO" or "unsorted"). If anyone wants the scripts, just give me a shout. Speaking of F-Spot: the new 0.2.2 version wasn't in Debian unstable yet, and the boolean logic queries were very appealing to me. So I took the liberty of preparing an unofficial package, which you'll find in my APT repository until official packages enter Sid. Oh, and since my friend's new EOS 400D isn't supported by the latest released libgphoto2 packages, I also patched them (a one-line patch, adding a pair of USB IDs) and uploaded them to the same place. Same thing: unofficial packages, use at your own risk, etc. He tells me there's a versioned dependency on dbus that can't be satisfied in testing, but that should be a temporary problem. That was the easy part. Now, the hard part: learning how to take good-looking photos. I expect that'll take much longer.
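The folder-walking dance described above looks roughly like this. The gphoto2 option names are from its manual, but the exact shape of the --list-folders output, and thus the awk pattern, is a guess; check it against what your own camera reports:

```shell
#! /bin/sh
# Sketch: enumerate the camera's DCIM subfolders and fetch each one.
# Assumes gphoto2 prints folder names in single quotes, one per line;
# verify against your own --list-folders output before relying on it.
gphoto2 --list-folders \
  | awk -F"'" '/DCIM\//{ print $2 }' \
  | while read -r folder ; do
        gphoto2 --folder "$folder" --get-all-files
    done
```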
